<jargon, programming> 1. In source code, some non-obvious constant whose value is significant to the operation of a program and that is inserted inconspicuously in-line (hard-coded), rather than expanded in place from a symbol set by a commented "#define". Magic numbers in this sense are bad style.
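
For illustration, a minimal C sketch (the 512-byte block size, the helper names, and the file name "data.bin" are invented for the example) contrasting an in-line magic number with the same value expanded from a commented "#define":

    #include <stdio.h>

    /* Bad style: 512 is a magic number; its meaning is invisible at the
       point of use and must be hunted down if it ever changes. */
    static void read_block_bad(FILE *f) {
        char buf[512];
        fread(buf, 1, 512, f);
    }

    /* Better: the value is expanded from a commented #define instead. */
    #define BLOCK_SIZE 512  /* size of one disk block, in bytes */

    static void read_block_good(FILE *f) {
        char buf[BLOCK_SIZE];
        fread(buf, 1, BLOCK_SIZE, f);
    }

    int main(void) {
        FILE *f = fopen("data.bin", "rb");
        if (f != NULL) {
            read_block_bad(f);
            read_block_good(f);
            fclose(f);
        }
        return 0;
    }
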
2. A number that encodes critical information used in an algorithm in some opaque way. The classic examples of these are the numbers used in hash or CRC functions, or the coefficients in a linear congruential generator for pseudorandom numbers. This sense actually predates, and was ancestral to, the more common sense 1.
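
As an illustration of sense 2, a minimal C sketch of a linear congruential generator; the multiplier 1664525 and increment 1013904223 (a well-known coefficient pair, here working modulo 2^32) look arbitrary at the point of use, which is exactly what makes them "magic":

    #include <stdint.h>
    #include <stdio.h>

    /* Linear congruential generator: x(n+1) = (a*x(n) + c) mod 2^32.
       The coefficients below are carefully chosen; changing them casually
       would silently ruin the generator's statistical properties. */
    static uint32_t lcg_state = 1u;  /* seed */

    static uint32_t lcg_next(void) {
        lcg_state = 1664525u * lcg_state + 1013904223u;  /* magic coefficients */
        return lcg_state;
    }

    int main(void) {
        for (int i = 0; i < 5; i++)
            printf("%u\n", (unsigned) lcg_next());
        return 0;
    }
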
3. Special data located at the beginning of a binary data file to indicate its type to a utility. Under Unix, the system and various applications programs (especially the linker) distinguish between types of executable file by looking for a magic number. Once upon a time, these magic numbers were PDP-11 branch instructions that skipped over header data to the start of executable code; 0407, for example, was octal for "branch 16 bytes relative". Nowadays only a wizard knows the spells to create magic numbers. {MS-DOS} executables begin with the magic string "MZ".
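
As a sketch of sense 3, a minimal C program (the file name "program.exe" is invented for the example) that identifies a file by reading its first two bytes and comparing them against the "MZ" magic string mentioned above:

    #include <stdio.h>

    int main(void) {
        FILE *f = fopen("program.exe", "rb");
        unsigned char magic[2];

        if (f == NULL)
            return 1;
        /* The magic value lives at the very start of the file. */
        if (fread(magic, 1, 2, f) == 2 && magic[0] == 'M' && magic[1] == 'Z')
            printf("looks like an MS-DOS executable\n");
        else
            printf("no MZ magic found\n");
        fclose(f);
        return 0;
    }
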
*The* magic number, on the other hand, is 7+/-2. The paper cited below established the number of distinct items (such as numeric digits) that humans can hold in short-term memory. Among other things, this strongly influenced the interface design of the phone system.
["The magical number seven, plus or minus two: some limits on our capacity for processing information", George Miller, in the "Psychological Review" 63:81-97, 1956].
[Jargon File]
(2003-07-02)